Deep Convolutional Neural Nets

Author

  • Carlo Tomasi
Abstract

The activation can be rewritten as a = v·x + b, where v = [w1, ..., wD] and b = wD+1: an inner product between a weight vector v and the input x, plus a bias b. Among inputs x of the same magnitude, as measured by the Euclidean norm ‖x‖, the activation is largest when x is parallel to v, so v can be viewed as a pattern or template to which x is compared. The bias b then raises or lowers the activation before it is passed through the activation function.
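The template-matching view of the activation can be sketched numerically: among inputs of equal norm, the input aligned with the weight vector v yields the largest activation. The values of v, x, and b below are made-up illustrative choices, not taken from the text.

```python
import numpy as np

# Sketch of the activation a = v.x + b described above.
# The template v and bias b are arbitrary illustrative values.
rng = np.random.default_rng(0)
D = 4
v = rng.normal(size=D)   # weight vector [w1, ..., wD]
b = 0.5                  # bias wD+1

def activation(x, v, b):
    """Inner product between template v and input x, plus bias b."""
    return v @ x + b

# Among inputs of the same Euclidean norm, the input parallel to v
# gives the largest activation.
x_parallel = v / np.linalg.norm(v)     # unit vector along v
x_other = rng.normal(size=D)
x_other /= np.linalg.norm(x_other)     # same magnitude, different direction

assert activation(x_parallel, v, b) >= activation(x_other, v, b)
```

The bias b shifts both activations equally, so it does not affect which input wins the comparison; it only moves the operating point fed into the activation function.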


Similar resources

Estimation of Hand Skeletal Postures by Using Deep Convolutional Neural Networks

Hand posture estimation attracts researchers because of its many applications. Hand posture recognition systems simulate the hand postures by using mathematical algorithms. Convolutional neural networks have provided the best results in the hand posture recognition so far. In this paper, we propose a new method to estimate the hand skeletal posture by using deep convolutional neural networks. T...


Knowledge Transfer in Deep Convolutional Neural Nets

Knowledge transfer is widely held to be a primary mechanism that enables humans to quickly learn new complex concepts when given only small training sets. In this paper, we apply knowledge transfer to deep convolutional neural nets, which we argue are particularly well suited for knowledge transfer. Our initial results demonstrate that components of a trained deep convolutional neural net can c...


Node Specificity in Convolutional Deep Nets Depends on Receptive Field Position and Size

In convolutional deep neural networks, receptive field (RF) size increases with hierarchical depth. When RF size approaches full coverage of the input image, different RF positions result in RFs with different specificity, as portions of the RF fall out of the input space. This leads to a departure from the convolutional concept of positional invariance and opens the possibility for complex for...


Very Deep Convolutional Networks for Text Classification

The dominant approaches for many NLP tasks are recurrent neural networks, in particular LSTMs, and convolutional neural networks. However, these architectures are rather shallow in comparison to the deep convolutional networks which are very successful in computer vision. We present a new architecture for text processing which operates directly on the character level and uses only small convoluti...


Convolutional deep rectifier neural nets for phone recognition

Rectifier neurons differ from standard ones only in that the sigmoid activation function is replaced by the rectifier function, max(0, x). Several recent studies suggest that rectifier units may be more suitable building units for deep nets. For example, we found that with deep rectifier networks one can attain a speech recognition performance similar to that of sigmoid nets, but without th...


A multi-scale convolutional neural network for automatic cloud and cloud shadow detection from Gaofen-1 images

The reconstruction of the information contaminated by cloud and cloud shadow is an important step in pre-processing of high-resolution satellite images. The cloud and cloud shadow automatic segmentation could be the first step in the process of reconstructing the information contaminated by cloud and cloud shadow. This stage is a remarkable challenge due to the relatively inefficient performanc...



Journal title:

Volume   Issue

Pages  -

Publication date: 2015